Most probable magnetohydrostatic equilibria for tokamaks and reversed field pinches
The determination of magnetohydrostatic equilibria usually requires that two of the equilibrium functions be given. As there is usually no a priori basis for specifying the form of these two functions, the functions and the equilibria they determine may be considered random. In this dissertation, the author reviews a recent statistical method for determining the equilibrium of an axially symmetric cylindrical plasma which is most probable (in the maximum-entropy sense) given four global constraints (energy, magnetic helicity, longitudinal magnetic flux, and longitudinal current flux). Previous results from this model have been limited to non-negative random equilibrium functions (B_z, J_z, where B is the magnetic field and J is the current density), and to analytically derived solutions of the determining equations in which one constraint (magnetic helicity) has been relaxed.

The present work extends these results to the fully constrained problem by presenting numerically computed solutions of the governing equations. Some of these solutions are specialized to values of the constraints appropriate to tokamaks. States which are approximately force-free (J = λB with λ constant) are shown to exist as solutions of the most-probable-state equations.

A further extension of the model is attempted in order to relax the restriction to non-negative random equilibrium functions. The extended model is applied to the problem of finding most probable equilibria with reversed magnetic fields. An examination of solutions constrained to different values of energy and magnetic helicity shows a tendency toward low-pressure equilibria as the energy-to-helicity ratio is lowered. This result is consistent with the Bessel-function model of reversed-field equilibria, in which dynamical relaxation of the energy with respect to a fixed magnetic helicity results in pressureless, Bessel-function equilibria. A study is made of the influence of the pinch ratio, an experimental parameter, on the degree of magnetic field reversal in the most probable state model. The dependence of solutions on this parameter is found to be qualitatively consistent with experiments.
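The Bessel-function model invoked above can be checked numerically. This sketch (the notation and normalization are mine, not the dissertation's) uses the standard force-free profiles B_z(r) = B0·J0(λr), B_θ(r) = B0·J1(λr), for which the axial field at the wall r = a reverses sign once λa crosses the first zero of J0 (about 2.405), and the pinch ratio is Θ = λa/2:

```python
# Bessel-function (Taylor) model of reversed-field equilibria:
# force-free profiles B_z(r) = B0*J0(lam*r), B_theta(r) = B0*J1(lam*r).
# The axial field at the wall reverses sign once lam*a crosses the
# first zero of J0 (~2.405); the pinch ratio is Theta = lam*a/2.
import math

def j0(x, terms=30):
    """Bessel function J0 via its power series (adequate for small x)."""
    return sum((-1) ** k * (x / 2) ** (2 * k) / math.factorial(k) ** 2
               for k in range(terms))

def bz_at_wall(lam_a, b0=1.0):
    """Axial field at the wall r = a for the Bessel-function model."""
    return b0 * j0(lam_a)

def pinch_ratio(lam_a):
    """Theta = B_theta(a) / <B_z>; equals lam*a/2 in this model."""
    return lam_a / 2.0

for lam_a in (2.0, 2.405, 3.0):
    print(f"lam*a = {lam_a}: Theta = {pinch_ratio(lam_a):.2f}, "
          f"B_z(a) = {bz_at_wall(lam_a):+.3f}")
```

Raising λa (equivalently, the pinch ratio) past the first zero of J0 drives the wall field negative, which is the sense in which the pinch ratio controls the degree of field reversal.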
Quantum Algorithm Implementations for Beginners
As quantum computers become available to the general public, the need has
arisen to train a cohort of quantum programmers, many of whom have been
developing classical computer programs for most of their careers. While
currently available quantum computers have fewer than 100 qubits, quantum
computing hardware is widely expected to grow in terms of qubit count, quality,
and connectivity. This review aims to explain the principles of quantum
programming, which are quite different from classical programming, with
straightforward algebra that makes understanding of the underlying fascinating
quantum mechanical principles optional. We give an introduction to quantum
computing algorithms and their implementation on real quantum hardware. We
survey 20 different quantum algorithms, attempting to describe each in a
succinct and self-contained fashion. We show how these algorithms can be
implemented on IBM's quantum computer, and in each case, we discuss the results
of the implementation with respect to differences between the simulator and the
actual hardware runs. This article introduces computer scientists, physicists,
and engineers to quantum algorithms and provides a blueprint for their
implementations.
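As an illustration of the "straightforward algebra" approach the abstract describes, a two-qubit Bell-state preparation (a staple of introductory surveys, not a specific circuit from this article) can be simulated with nothing but matrix multiplication:

```python
# State-vector simulation of Bell-state preparation using plain linear
# algebra: apply a Hadamard to qubit 0, then a CNOT (control 0, target 1),
# to the initial state |00>.  Basis order is |00>, |01>, |10>, |11>.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                 # flips the second qubit
                 [0, 1, 0, 0],                 # when the first is |1>
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(4)
state[0] = 1.0                                 # |00>
state = CNOT @ np.kron(H, I) @ state           # H on qubit 0, then CNOT
print(state)                                   # 1/sqrt(2) on |00> and |11>
```

The resulting amplitudes, 1/√2 on |00> and |11>, are exactly what a run on a noiseless simulator returns; hardware runs of the same circuit show the deviations from these ideal values that the survey discusses.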
Data Standards for the Genomes to Life Program
Existing GTL projects already have produced volumes of data and, over the course of the next five years, will produce an estimated hundreds, or possibly thousands, of terabytes of data from hundreds of experiments conducted at dozens of laboratories in national labs and universities across the nation. These data will be the basis for publications by individual researchers, research groups, and multi-institutional collaborations, and the basis for future DOE decisions on funding further research in bioremediation. The short-term and long-term value of the data to project participants, to the DOE, and to the nation depends, however, on being able to access the data and on how, or whether, the data are archived. The ability to access data is the starting point for data analysis and interpretation, data integration, data mining, and the development of data-driven models. Limited or inefficient data access means that less data are analyzed in a cost-effective and timely manner. Data production in the GTL Program will likely outstrip, or may have already outstripped, the ability to analyze the data. Being able to access data depends on two key factors: data standards and implementation of the data standards. For the purpose of this proposal, a data standard is defined as a standard, documented way in which data, and information about the data, are described. The attributes of the experiment in which the data were collected need to be known, and the measurements corresponding to the data collected need to be described. In general terms, a data standard could be a form (electronic or paper) that is completed by a researcher, or a document that prescribes how a protocol or experiment should be described in writing. Data standards are critical to data access because they provide a framework for organizing and managing data. Researchers spend significant amounts of time managing data and information about experiments using lab notebooks, computer files, Excel spreadsheets, etc.
In addition, data output format varies for different equipment and usually needs to be formatted differently for the variety of computer programs used to display and analyze the data. If, however, data for a given type of experiment were converted from vendor format to a format defined by a data standard, then researchers and software developers could save time. In addition, if data and information describing how they were obtained were available in a consistent format throughout the GTL Program, comparison and integration of results would be facilitated, and a data repository could be built to encourage project-wide data mining. Data standards also are essential for archiving data sets. If data are stored together with the experiment metadata (i.e., information about the data) in an 'information/data package', then the data retain their value due to the accessibility of information about measurement and analysis procedures. DOE's commitment to developing data standards for the GTL Program is needed to ensure that the most value is obtained from DOE's expenditures on experimental work and to provide a data repository that can be used as the basis for ongoing model development. By developing data standards for experiments conducted as part of the GTL Program, DOE has the opportunity to facilitate data sharing not only within the DOE community, but also with research institutes throughout the world.
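The 'information/data package' idea can be made concrete with a minimal sketch: data stored together with the metadata describing how they were obtained, in one self-describing record. All field names below are illustrative assumptions, not a published GTL schema:

```python
# A minimal 'information/data package': experiment metadata stored
# alongside a pointer to the data in one self-describing record.
# Every field name here is hypothetical, chosen for illustration only.
import json

package = {
    "experiment": {
        "protocol": "mass-spectrometry proteomics",   # hypothetical
        "laboratory": "example national laboratory",  # hypothetical
        "date": "2003-06-01",
        "instrument": {"vendor": "example", "native_format": "raw"},
    },
    "measurements": {
        "units": "intensity (arbitrary)",
        "data_file": "run_001.csv",
    },
}

# Serializing to a standard interchange format makes the record usable
# by any analysis tool, independent of the instrument vendor's format.
print(json.dumps(package, indent=2))
```

Because the measurement attributes travel with the data, a repository of such packages can be mined project-wide without consulting each originating lab.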
A vectorized particle tracer for unstructured grids
A vectorized particle tracer for unstructured grids is described. The basic approach is to use elementary properties of the linear basis functions to search for particles on the grid, using the element last occupied as an initial guess. To permit vectorization, a simple binary sort of the particles is performed every timestep, such that all particles that have not yet found their host element remain at the top of the list. In this way, vector loops can be easily formed. Timings taken from a numerical example indicate that speed-ups of the order of 1:14 can be obtained on vector machines when using this algorithm.
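The search step can be sketched as follows; the function names and data layout are assumptions for illustration, not the paper's code. A particle lies inside a triangle exactly when all three of its linear (barycentric) basis-function values are non-negative, and the search tries the element last occupied first:

```python
# Particle search on a triangular grid via linear basis functions.
# A point is inside a triangle iff all three barycentric coordinates
# (the values of the linear basis functions) are >= 0.
import numpy as np

def barycentric(tri, p):
    """Linear basis-function values of point p in triangle tri (3x2 array)."""
    a, b, c = tri
    T = np.column_stack((b - a, c - a))
    l1, l2 = np.linalg.solve(T, p - a)
    return np.array([1.0 - l1 - l2, l1, l2])

def locate(tris, p, last=0):
    """Find the host element of p, trying the last-occupied element first."""
    order = [last] + [e for e in range(len(tris)) if e != last]
    for e in order:
        lam = barycentric(tris[e], p)
        if lam.min() >= -1e-12:        # inside (or on the boundary of) e
            return e
    return -1                          # particle has left the grid

# Two triangles sharing an edge; a particle that moved from element 0 to 1.
tris = [np.array([[0., 0.], [1., 0.], [0., 1.]]),
        np.array([[1., 0.], [1., 1.], [0., 1.]])]
print(locate(tris, np.array([0.7, 0.7]), last=0))
```

The binary sort the abstract describes would, each timestep, partition the particle list so the not-yet-located particles sit contiguously at the top, letting the search run as a clean vector loop over that prefix rather than scattered scalar lookups.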
Electromagnetic PIC simulations using finite elements on unstructured grids
There is an increasing demand to use kinetic plasma simulations to model real plasma devices such as e-beam diodes, plasma torches, opening switches, and so on. With this demand comes the realization that standard simulation tools based on rectangular structured meshes are often too inflexible to accommodate real device geometries. To gain geometric flexibility, one typically must abandon structured rectangular meshes. Some have had considerable success by distorting the grid to conform to boundary-fitted curvilinear coordinates. This gives substantial flexibility while retaining a logically rectangular data structure. However, to achieve the greatest flexibility, an unstructured mesh is necessary. For the last several years, we have been engaged in designing electromagnetic particle-in-cell (PIC) simulations that can be performed on unstructured grids of triangles. We feel this is a powerful strategy for matching kinetic plasma simulations to real problem geometries. Unstructured meshes not only accommodate complicated boundary shapes with ease, but also allow extreme local refinement without affecting resolution elsewhere. The basic challenges to overcome in formulating EM-PIC on these new meshes are: (1) grid generation; (2) particle interpolation and tracking; and (3) field solution. In this paper we briefly describe our techniques and give a simple example.
Electromagnetic particle codes on unstructured grids
The most widely used computational model of collisionless plasmas is the Lagrangian-Eulerian hybrid technique known as particle-in-cell, or PIC. In the electromagnetic version, Maxwell's equations are solved on an Eulerian grid and electromagnetic forces are interpolated from the grid to particle locations. Particles are then moved in Lagrangian fashion while their currents are interpolated back onto the grid to provide sources for the fields on the next cycle. There are many applications where one needs to model plasmas and electromagnetic waves inside regions of complicated shape. Traditional methods for solving Maxwell's equations employ finite differences on regular grids to replace differential operators. These methods are awkward for complicated boundary shapes, often replacing smoothly curved or slanted boundaries with stairsteps. The desire to incorporate realistic boundaries into plasma simulations is motivated by a host of situations in which proper representation of the boundary shape is expected to be critical. Our approach to solving this problem is to design electromagnetic particle codes based on the use of unstructured grids. The arbitrary connectivity of unstructured grids provides the flexibility to place nodes wherever needed to fit the most complex boundary shapes. The most significant problems that must be addressed as a result of this strategy are: grid generation, field solution, and particle tracking. Our solutions to these problems, along with a few preliminary results, are presented in this paper.
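The gather/push/scatter cycle described above can be sketched in a few lines. This toy uses a 1D periodic structured grid with linear weighting and a frozen field (no Maxwell solve), so it illustrates the cycle itself rather than the paper's unstructured-mesh formulation; all names and parameters are illustrative:

```python
# The PIC cycle in its simplest form: gather fields to particles,
# push particles (leapfrog), scatter charge back to the grid.
# 1D periodic grid, linear weighting; the field solve is omitted and
# E is held fixed, so this is a toy, not a full electromagnetic code.
import numpy as np

ng, L = 16, 1.0
dx = L / ng
E = np.full(ng, 0.1)                  # uniform (frozen) electric field
x = np.array([0.30, 0.55])            # particle positions
v = np.array([0.00, 0.00])            # particle velocities
q_over_m, dt = -1.0, 0.01

def gather(E, x):
    """Interpolate the grid field to particle positions (linear weights)."""
    s = x / dx
    i = np.floor(s).astype(int) % ng
    w = s - np.floor(s)
    return (1 - w) * E[i] + w * E[(i + 1) % ng]

def scatter(x, q=1.0):
    """Deposit particle charge onto the grid with the same linear weights."""
    rho = np.zeros(ng)
    s = x / dx
    i = np.floor(s).astype(int) % ng
    w = s - np.floor(s)
    np.add.at(rho, i, q * (1 - w))        # unbuffered in-place deposit
    np.add.at(rho, (i + 1) % ng, q * w)
    return rho

for _ in range(10):                   # a few leapfrog steps
    v += q_over_m * gather(E, x) * dt
    x = (x + v * dt) % L
rho = scatter(x)                      # source term for the next field solve
```

On an unstructured triangular mesh the same gather and scatter are performed with the linear basis functions of each particle's host element, which is why particle tracking (locating that host element) becomes one of the central problems the paper lists.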
Electromagnetic scattering calculations using a finite-element solver for the Maxwell equations
We describe a pair of finite-element codes which use unstructured meshes to solve the time-dependent Maxwell equations, with particular emphasis on their application to electromagnetic scattering problems. A two-step, flux-corrected transport scheme is used to effect the time integration, while the spatial structure of the field is determined by a Galerkin procedure. The basis functions are piecewise-linear on three-noded triangles in two dimensions and four-noded tetrahedra in three. For the periodic scattering problems with which we are presently concerned, adaptive remeshing is a convenient and powerful method for improving the quality of the solutions. Results for the analytically tractable case of scattering by a perfectly conducting circular cylinder are used to illustrate the performance of the codes.